On model selection consistency of regularized M-estimators

Authors

  • Jason D. Lee
  • Yuekai Sun
  • Jonathan E. Taylor
Abstract

Penalized M-estimators are used in many areas of science and engineering to fit models with some low-dimensional structure in high-dimensional settings. In many problems arising in machine learning, signal processing, and high-dimensional statistics, the penalties are geometrically decomposable, i.e., they can be expressed as a sum of support functions. We generalize the notion of irrepresentability and develop a general framework for establishing the model selection consistency of M-estimators with these penalties. We then use this framework to derive results for some special cases of interest in machine learning and high-dimensional statistics.
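
To make the decomposition concrete, here is a minimal sketch in our own notation (the sets A and I are illustrative, not taken from the paper): a penalty \rho is geometrically decomposable if it can be written as a sum of support functions of closed convex sets,

    \rho(\theta) = h_A(\theta) + h_I(\theta), \qquad h_C(x) = \sup_{y \in C} \langle x, y \rangle .

For example, the \ell_1 norm is the support function of the unit \ell_\infty ball, \|\theta\|_1 = \sup_{\|y\|_\infty \le 1} \langle \theta, y \rangle, so the lasso penalty fits this template with a single set.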

Similar articles

Sparsistency of ℓ1-Regularized M-Estimators

We consider the model selection consistency, or sparsistency, of a broad class of ℓ1-regularized M-estimators for linear and nonlinear statistical models in a unified fashion. For this purpose, we propose the local structured smoothness condition (LSSC) on the loss function. We provide a general result giving deterministic sufficient conditions for sparsistency in terms of the regularization parame...
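
For reference, the generic estimator studied in this line of work can be sketched as follows (our notation; \ell is the loss, \lambda_n the regularization parameter, and \theta^* the true parameter):

    \hat{\theta} \in \arg\min_{\theta} \; \frac{1}{n} \sum_{i=1}^{n} \ell(\theta; x_i, y_i) + \lambda_n \|\theta\|_1 ,

and sparsistency means that \mathrm{supp}(\hat{\theta}) = \mathrm{supp}(\theta^*) holds with probability tending to one as n \to \infty.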

Dimension Reduction and Variable Selection in Case Control Studies via Regularized Likelihood Optimization

Dimension reduction and variable selection are performed routinely in case-control studies, but the literature on the theoretical aspects of the resulting estimates is scarce. We bring our contribution to this literature by studying estimators obtained via ℓ1 penalized likelihood optimization. We show that the optimizers of the ℓ1 penalized retrospective likelihood coincide with the optimizers ...

Identity for the NPMLE in censored data models.

We derive an identity for nonparametric maximum likelihood estimators (NPMLE) and regularized MLEs in censored data models, which expresses the standardized maximum likelihood estimators in terms of the standardized empirical process. This identity provides an effective starting point for proving both consistency and efficiency of the NPMLE and regularized MLE. The identity and corresponding method f...

A unified framework for high-dimensional analysis of $M$-estimators with decomposable regularizers

High-dimensional statistical inference deals with models in which the number of parameters p is comparable to or larger than the sample size n. Since it is usually impossible to obtain consistent procedures unless p/n → 0, a line of recent work has studied models with various types of low-dimensional structure, including sparse vectors, sparse and structured matrices, low-rank matrices, and...
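
The decomposability condition in that framework can be sketched as follows (our paraphrase of the standard definition, not a quote from the paper): a norm R is decomposable with respect to a pair of subspaces \mathcal{M} \subseteq \bar{\mathcal{M}} if

    R(\theta + \gamma) = R(\theta) + R(\gamma) \quad \text{for all } \theta \in \mathcal{M}, \; \gamma \in \bar{\mathcal{M}}^{\perp} .

The \ell_1 norm, for instance, satisfies this with \mathcal{M} = \bar{\mathcal{M}} the subspace of vectors supported on a fixed set S, since \|\theta + \gamma\|_1 = \|\theta\|_1 + \|\gamma\|_1 whenever \theta and \gamma have disjoint supports.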

On model selection consistency of M-estimators with geometrically decomposable penalties

Penalized M-estimators are used in diverse areas of science and engineering to fit high-dimensional models with some low-dimensional structure. Often, the penalties are geometrically decomposable, i.e., they can be expressed as a sum of support functions over convex sets. We generalize the irrepresentable condition to geometrically decomposable penalties and develop a general framework for establishi...
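
For orientation, the classical irrepresentable condition that this work generalizes can be sketched in the lasso special case (our notation; \hat{\Sigma} is the sample covariance, S the true support, and \eta > 0 a fixed margin):

    \big\| \hat{\Sigma}_{S^c S} \, \hat{\Sigma}_{S S}^{-1} \, \mathrm{sign}(\theta^*_S) \big\|_\infty \le 1 - \eta .

The paper's generalized condition plays an analogous role for geometrically decomposable penalties.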

Journal:
  • CoRR

Volume: abs/1305.7477

Publication year: 2013